Loopy Belief Propagation





A Complete Variational Tracker

Ryan D. Turner, Steven Bottone, Bhargav Avasarala

Neural Information Processing Systems

We introduce a novel probabilistic tracking algorithm that incorporates combinatorial data association constraints and model-based track management using variational Bayes. We use a Bethe entropy approximation to incorporate data association constraints that are often ignored in previous probabilistic tracking algorithms. Noteworthy aspects of our method include a model-based mechanism to replace heuristic logic typically used to initiate and destroy tracks, and an assignment posterior with linear computation cost in window length as opposed to the exponential scaling of previous MAP-based approaches. We demonstrate the applicability of our method on radar tracking and computer vision problems. The field of tracking is broad and possesses many applications, particularly in radar/sonar [1], robotics [14], and computer vision [3].
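The hard combinatorial part of multi-target tracking is the one-to-one data association constraint: each track explains at most one measurement per scan, and each measurement belongs to at most one track. The paper enforces this inside a variational posterior via a Bethe entropy approximation; as a loose illustration of what a "soft" assignment respecting such constraints looks like (this is not the authors' algorithm, and the likelihood matrix `L` below is made up), one can project a single-scan association likelihood matrix onto doubly stochastic form with Sinkhorn iterations:

```python
import numpy as np

# Hypothetical single-scan association: L[i, j] = likelihood that track i
# generated measurement j (random values purely for illustration).
rng = np.random.default_rng(0)
L = rng.uniform(0.1, 1.0, size=(4, 4))

# Sinkhorn iteration: alternately normalize rows and columns until the
# matrix is (approximately) doubly stochastic, i.e. a soft assignment in
# which each track and each measurement is used exactly once in expectation.
A = L.copy()
for _ in range(200):
    A /= A.sum(axis=1, keepdims=True)  # each track assigns to one measurement
    A /= A.sum(axis=0, keepdims=True)  # each measurement explained by one track
```

A doubly stochastic `A` is a point in the assignment polytope; approximating marginals over permutations (rather than just one soft matrix) is where Bethe-style approximations of the matrix permanent come in.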





Learning unbelievable probabilities

Yashar Ahmadian, Department of Brain and Cognitive Science, University of Rochester, Rochester, NY 14607; Center for Theoretical Neuroscience, Columbia University

Neural Information Processing Systems

Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm.
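The claim concerns which marginals loopy BP can and cannot reach. As a self-contained sketch (not taken from the paper; the coupling and bias values are arbitrary), the following runs sum-product message passing on the smallest loopy graph, a triangle of binary variables, and compares the resulting beliefs with brute-force marginals:

```python
import numpy as np
from itertools import product

# Binary pairwise MRF on a triangle: p(x) ~ prod_i phi[i, x_i] * prod_(i,j) psi[x_i, x_j].
# Attractive coupling J and a bias on node 0 (arbitrary illustration values).
J, n = 0.5, 3
psi = np.exp(J * np.array([[1.0, -1.0], [-1.0, 1.0]]))  # psi[x_i, x_j]
phi = np.exp(np.outer([0.3, 0.0, 0.0], [1.0, -1.0]))    # phi[i, x_i]
edges = [(0, 1), (1, 2), (0, 2)]
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# Exact marginals by brute-force enumeration of all 2^3 states.
joint = np.zeros((2, 2, 2))
for x in product(range(2), repeat=n):
    p = np.prod([phi[i, x[i]] for i in range(n)])
    for i, j in edges:
        p *= psi[x[i], x[j]]
    joint[x] = p
joint /= joint.sum()
exact = [joint.sum(axis=tuple(k for k in range(n) if k != i)) for i in range(n)]

# Loopy sum-product: msgs[(i, j)] is the message from node i to node j.
msgs = {(i, j): np.full(2, 0.5) for i in neighbors for j in neighbors[i]}
for _ in range(200):
    new = {}
    for i, j in msgs:
        inc = phi[i].copy()
        for k in neighbors[i]:
            if k != j:
                inc = inc * msgs[(k, i)]
        m = psi.T @ inc            # marginalize out x_i
        new[(i, j)] = m / m.sum()
    msgs = new

# Beliefs (BP's approximate marginals): product of incoming messages.
beliefs = []
for i in range(n):
    b = phi[i] * msgs[(neighbors[i][0], i)] * msgs[(neighbors[i][1], i)]
    beliefs.append(b / b.sum())
```

On this small loop the beliefs typically land close to, but not exactly on, the exact marginals; the paper's point is the converse direction: some perfectly consistent marginals are not BP fixed points for *any* choice of potentials.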




Very loopy belief propagation for unwrapping phase images

Neural Information Processing Systems

Since the discovery that the best error-correcting decoding algorithm can be viewed as belief propagation in a cycle-bound graph, researchers have been trying to determine under what circumstances "loopy belief propagation" is effective for probabilistic inference. Despite several theoretical advances in our understanding of loopy belief propagation, to our knowledge, the only problem that has been solved using loopy belief propagation is error-correcting decoding on Gaussian channels. We propose a new representation for the two-dimensional phase unwrapping problem, and we show that loopy belief propagation produces results that are superior to existing techniques. This is an important result, since many imaging techniques, including magnetic resonance imaging and interferometric synthetic aperture radar, produce phase-wrapped images. Interestingly, the graph that we use has a very large number of very short cycles, supporting evidence that a large minimum cycle length is not needed for excellent results using belief propagation.
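Phase wrapping is easiest to see in one dimension, where a simple scan-line rule already suffices; the paper's contribution is the much harder two-dimensional, noisy case, where such local rules break down and loopy BP helps. A minimal 1-D illustration (using NumPy's `np.unwrap`, which is unrelated to the paper's method):

```python
import numpy as np

# A smooth 1-D phase ramp, then wrapped into (-pi, pi] as a sensor records it.
true_phase = np.linspace(0, 6 * np.pi, 200)
wrapped = np.angle(np.exp(1j * true_phase))

# 1-D unwrapping: add/subtract 2*pi whenever consecutive samples jump by > pi.
unwrapped = np.unwrap(wrapped)

# Up to a global 2*pi*k offset, this recovers the original phase.
err = np.max(np.abs(unwrapped - true_phase))
```

The same greedy rule fails in 2-D whenever noise or steep gradients make neighboring wrap decisions inconsistent, which is exactly where a probabilistic formulation pays off.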


Validity Estimates for Loopy Belief Propagation on Binary Real-world Networks

Neural Information Processing Systems

We introduce a computationally efficient method to estimate the validity of the belief propagation (BP) method as a function of graph topology, the connectivity strength, frustration, and network size. We present numerical results that demonstrate the correctness of our estimates for the uniform random model and for a real-world network ("C. Although the method is restricted to pair-wise interactions, no local evidence (zero "biases"), and binary variables, we believe that its predictions correctly capture the limitations of BP for inference and MAP estimation on arbitrary graphical models. Using this approach, we find that BP always performs better than the mean-field (MF) approximation. Especially for large networks with broad degree distributions (such as scale-free networks), BP turns out to significantly outperform MF.
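The BP-versus-MF comparison above refers to naive mean-field, which replaces the joint distribution by a product of single-site factors and iterates self-consistency equations for the magnetizations. A generic sketch for binary ±1 spins (the coupling matrix and bias below are arbitrary illustration values, not the paper's networks):

```python
import numpy as np

# Naive mean-field for a pairwise binary (+/-1 spin) model: the magnetizations
# satisfy the self-consistency equations  m_i = tanh(h_i + sum_j J[i, j] * m_j).
J = 0.5 * (np.ones((3, 3)) - np.eye(3))   # triangle, uniform coupling 0.5
h = np.array([0.3, 0.0, 0.0])             # external bias on spin 0 only

m = np.zeros(3)
for _ in range(500):
    m_new = np.tanh(h + J @ m)            # damped/plain fixed-point update
    if np.max(np.abs(m_new - m)) < 1e-10:
        m = m_new
        break
    m = m_new
```

Because MF ignores correlations between neighbors while BP retains pairwise consistency, comparisons like the one in the abstract generally favor BP, at the cost of message-passing overhead.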